Dynamic Data Editing in online data collection for the Vacant Positions Survey (Stax, H.-P., 2011)
At the workshop I will present the objectives, procedures and results of a pilot study.
Background
Conventionally, data editing is initiated after data collection. The preliminary aim of post-collection data editing is to identify possibly defective data; the ultimate aim is to correct errors and improve data quality. Correcting possibly defective data may involve costly manual record checks, burdensome re-contact with individual businesses/respondents, and possible bias from automatic imputation and statistical editing. To minimise costs and burden, the cut-off level that determines which possibly defective input data are actually edited may be set so that relatively few records are edited, resulting in low-quality output data.
Pilot survey
Data collection for the statistics on Vacant Positions for the ESS is fairly simple and straightforward compared to most business statistics at Statistics Denmark. The questionnaire asks two questions: the number of vacant positions and, as a reference variable for data editing, the number of persons currently employed. Previously, data editing for this survey was initiated after data collection: for each work unit in the sample, the reported number of vacant positions was compared to the reported number of persons employed. The reported number of employees was also compared to the number of employees recorded in the survey 12 months earlier and to the number of employees currently recorded for the work unit in the Danish Central Business Register. This was done to check whether the responding businesses had reported vacant positions for the intended work unit (and not, e.g., for the whole enterprise). Correcting possibly defective data involved manual record checks, re-contacting respondents by phone, and statistical imputation and manipulation.
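To make the cross-reference logic concrete, the following is a minimal sketch of what such post-collection checks could look like. The record structure, field names and the 50% tolerance are illustrative assumptions; the actual rules and thresholds used at Statistics Denmark are not described here.

```typescript
// Hypothetical record structures; field names and thresholds are assumptions.
interface WorkUnitResponse {
  workUnitId: string;
  vacantPositions: number;   // reported in the questionnaire
  personsEmployed: number;   // reported reference variable
}

interface ReferenceData {
  employeesLastYear: number;   // recorded in the survey 12 months earlier
  employeesInRegister: number; // Danish Central Business Register
}

interface EditFlag {
  workUnitId: string;
  check: string;
  message: string;
}

// Flag responses that fail any of the cross-reference checks described above.
function runPostCollectionChecks(
  response: WorkUnitResponse,
  reference: ReferenceData,
  tolerance = 0.5 // assumed relative tolerance for employee counts
): EditFlag[] {
  const flags: EditFlag[] = [];

  // Vacant positions should not exceed the reported employment.
  if (response.vacantPositions > response.personsEmployed) {
    flags.push({
      workUnitId: response.workUnitId,
      check: "vacancies-vs-employment",
      message: "More vacant positions than persons employed.",
    });
  }

  const deviates = (value: number, ref: number) =>
    ref > 0 && Math.abs(value - ref) / ref > tolerance;

  // Reported employment vs. last year's survey figure.
  if (deviates(response.personsEmployed, reference.employeesLastYear)) {
    flags.push({
      workUnitId: response.workUnitId,
      check: "employment-vs-last-year",
      message: "Reported employment deviates strongly from last year's survey figure.",
    });
  }

  // Reported employment vs. the Business Register figure; a large deviation may
  // indicate that the whole enterprise was reported instead of the work unit.
  if (deviates(response.personsEmployed, reference.employeesInRegister)) {
    flags.push({
      workUnitId: response.workUnitId,
      check: "employment-vs-register",
      message: "Reported employment deviates strongly from the Business Register figure.",
    });
  }

  return flags;
}
```

A record that produces one or more flags would then be routed to manual review, phone re-contact or imputation, as described above.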
Advanced dynamic online data editing
At Statistics Denmark, Survey and Methods, the vacant positions survey was selected as a pilot for “advanced dynamic online data editing”. An online questionnaire was developed in which some of the “old” post-collection cross-reference edit checks were incorporated and run dynamically, as the respondent enters data in the form. The questionnaire for each individual work unit was pre-loaded with the number of employees recorded in the survey 12 months earlier and the number of employees currently recorded in the Danish Central Business Register. These values were not visible to the respondent in the form but were used only as reference data. If data entered in the form do not comply with the “old” post-collection edit checks, a warning is generated and the respondent is asked to reconsider and either modify the response or enter a comment/explanation.
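As an illustration of how such checks might be wired into an online form, the sketch below re-runs the comparisons whenever the respondent changes a value and blocks submission of a flagged response unless an explanation is entered. The element IDs, field names and the 50% deviation tolerance are hypothetical assumptions, not the actual implementation used in the pilot.

```typescript
// Minimal sketch of a dynamic edit check in an online questionnaire, using a
// plain DOM form. Element IDs, field names and the tolerance are assumptions.
interface PreloadedReference {
  employeesLastYear: number;   // from the survey 12 months earlier (not shown to the respondent)
  employeesInRegister: number; // from the Danish Central Business Register (not shown)
}

function checkEntries(vacant: number, employed: number, ref: PreloadedReference): string[] {
  const warnings: string[] = [];
  if (vacant > employed) {
    warnings.push("You have reported more vacant positions than persons employed.");
  }
  const deviates = (value: number, reference: number) =>
    reference > 0 && Math.abs(value - reference) / reference > 0.5;
  if (deviates(employed, ref.employeesLastYear) || deviates(employed, ref.employeesInRegister)) {
    warnings.push(
      "The reported number of employees differs strongly from our reference figures. " +
      "Please check that the figures cover the intended work unit only."
    );
  }
  return warnings;
}

function attachDynamicEditing(ref: PreloadedReference): void {
  const form = document.querySelector<HTMLFormElement>("#vacancy-form")!;
  const vacantInput = form.querySelector<HTMLInputElement>("#vacantPositions")!;
  const employedInput = form.querySelector<HTMLInputElement>("#personsEmployed")!;
  const commentInput = form.querySelector<HTMLTextAreaElement>("#comment")!;
  const warningBox = form.querySelector<HTMLElement>("#warnings")!;

  // Re-run the checks as the respondent enters data and show warnings immediately.
  const revalidate = () => {
    const warnings = checkEntries(Number(vacantInput.value), Number(employedInput.value), ref);
    warningBox.textContent = warnings.join(" ");
  };
  vacantInput.addEventListener("change", revalidate);
  employedInput.addEventListener("change", revalidate);

  // A flagged response can only be submitted after the respondent has either
  // modified the figures (so no warnings remain) or entered an explanation.
  form.addEventListener("submit", (event) => {
    revalidate();
    if (warningBox.textContent && commentInput.value.trim() === "") {
      event.preventDefault();
      warningBox.textContent += " Please adjust the figures or add an explanation before submitting.";
    }
  });
}
```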